We derive information-theoretic converses (i.e., lower bounds) for the minimum time required by any algorithm for distributed function computation over a network of point-to-point channels with finite capacity, where each node of the network initially has a random observation and aims to compute a common function of all observations to a given accuracy with a given confidence by exchanging messages with its neighbors. We obtain the lower bounds on computation time by examining the conditional mutual information between the actual function value and its estimate at an arbitrary node, given the observations in an arbitrary subset of nodes containing that node. The main contributions include: 1) A lower bound on the conditional mutual information via so-called small ball probabilities, which captures the dependence of the computation time on the joint distribution of the observations at the nodes, the structure of the function, and the accuracy requirement. For linear functions, the small ball probability can be expressed in terms of Lévy concentration functions of sums of independent random variables, for which tight estimates are available that lead to strict improvements over existing lower bounds on computation time. 2) An upper bound on the conditional mutual information via strong data processing inequalities, which complements and strengthens existing cutset-capacity upper bounds. 3) A multi-cutset analysis that quantifies the loss (dissipation) of the information needed for computation as it flows across a succession of cutsets in the network. This analysis is based on reducing a general network to a line network with bidirectional links and self-links, and the results highlight the dependence of the computation time on the diameter of the network, a fundamental parameter that is missing from most of the existing lower bounds on computation time.
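To fix ideas for the first two contributions, recall the standard quantities involved; the displayed bounds below are illustrative sketches of the general shape of such results, not verbatim statements from the paper. The Lévy concentration function of a real random variable $Z$ at radius $\epsilon$ is
$$\mathcal{L}(Z,\epsilon) \;=\; \sup_{z \in \mathbb{R}} \Pr\big[\,|Z - z| \le \epsilon\,\big],$$
and a representative small-ball (Fano-type) bound says that if an estimate $\hat{Z}$ achieves $\Pr[\,|\hat{Z} - Z| \le \epsilon\,] \ge 1-\delta$, then
$$I(Z;\hat{Z}) \;\ge\; (1-\delta)\log\frac{1}{\mathcal{L}(Z,\epsilon)} - h(\delta),$$
where $h(\delta)$ is the binary entropy function; the conditional version has the same shape, with conditioning on the observations in the chosen subset of nodes, and the precise regularity conditions are spelled out in the paper. Similarly, the strong data processing inequalities underlying the second contribution take the standard form: for a Markov chain $U \to X \to Y$ in which $Y$ is the output of a fixed channel $K$ with input $X$,
$$I(U;Y) \;\le\; \eta(K)\, I(U;X),$$
where $\eta(K) \le 1$ is the SDPI constant of $K$, strictly less than $1$ for many channels of interest; iterating such contractions across successive cutsets is what produces the dissipation quantified in the third contribution.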